OP-KNN: Method and Applications

Authors

  • Qi Yu
  • Yoan Miché
  • Antti Sorjamaa
  • Alberto Guillén
  • Amaury Lendasse
  • Eric Séverin
Abstract

This paper presents a methodology named Optimally Pruned K-Nearest Neighbors (OP-KNN), which has the advantage of competing with state-of-the-art methods while remaining fast. It builds a one-hidden-layer feedforward neural network using K-Nearest Neighbors as kernels to perform regression. Multiresponse Sparse Regression (MRSR) is used to rank each kth nearest neighbor, and Leave-One-Out estimation is then used to select the optimal number of neighbors and to estimate the generalization performance. Since the computational time of this method is small, this paper presents a strategy using OP-KNN to perform variable selection, which is tested successfully on eight real-life data sets from different application fields. In summary, the most significant characteristic of this method is that it provides good performance and a comparatively simple model at an extremely high learning speed.
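The pipeline the abstract describes (KNN values as hidden-layer outputs, neuron ranking, then Leave-One-Out selection of how many neighbors to keep) can be sketched roughly as below. This is a simplified illustration, not the authors' implementation: the MRSR ranking step is replaced by the natural neighbor order k = 1..K, and `knn_features` and `loo_press` are hypothetical helper names.

```python
import numpy as np

def knn_features(X, y, K):
    """Hidden layer: the target value of each sample's k-th nearest
    neighbor (k = 1..K), excluding the sample itself."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)       # a point is not its own neighbor
    order = np.argsort(d, axis=1)     # neighbors sorted by distance
    return y[order[:, :K]]            # shape (n_samples, K)

def loo_press(H, y):
    """Closed-form Leave-One-Out error (PRESS) for a linear output layer."""
    P = H @ np.linalg.pinv(H)         # hat matrix H (H^T H)^-1 H^T
    resid = y - P @ y
    return np.mean((resid / (1.0 - np.diag(P))) ** 2)

# Toy regression problem: y is the sum of the inputs plus noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X.sum(axis=1) + 0.1 * rng.normal(size=100)

H = knn_features(X, y, K=10)
# Stand-in for MRSR ranking: evaluate each prefix of neighbors by its
# LOO error and keep the best one.
errors = [loo_press(H[:, :k], y) for k in range(1, 11)]
best_k = int(np.argmin(errors)) + 1
```

The closed-form PRESS statistic is what makes the selection step cheap: one pseudo-inverse per candidate model instead of n refits, which is why the method can afford to wrap variable selection around it.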


Similar Articles

A faster model selection criterion for OP-ELM and OP-KNN: Hannan-Quinn criterion

The Optimally Pruned Extreme Learning Machine (OP-ELM) and Optimally Pruned K-Nearest Neighbors (OP-KNN) algorithms use a similar methodology based on random initialization (OP-ELM) or KNN initialization (OP-KNN) of a feedforward neural network, followed by ranking of the neurons; the ranking is used to determine the best combination to retain. This is achieved by Leave-One-Out (LOO) crossvalidat...


Tabu Search with Delta Test for Time Series Prediction using OP-KNN

This paper presents a working combination of input selection strategy and a fast approximator for time series prediction. The input selection is performed using Tabu Search with the Delta Test. The approximation methodology is called Optimally-Pruned k -Nearest Neighbors (OP-KNN), which has been recently developed for fast and accurate regression and classification tasks. In this paper we demon...


Optimizing Collaborative Filtering by Interpolating the Individual and Group Behaviors

Collaborative filtering has been very successful in both research and E-commerce applications. One of the most popular collaborative filtering algorithms is the k-Nearest Neighbor (KNN) method, which finds the k nearest neighbors of a given user to predict his interests. Previous research on the KNN algorithm usually suffers from the data sparseness problem, because the quantity of items users voted i...


KNN Model-Based Approach in Classification

The k-Nearest-Neighbours (kNN) method is a simple but effective method for classification. The major drawbacks of kNN are (1) its low efficiency as a lazy learning method, which prohibits its use in many applications such as dynamic web mining for a large repository, and (2) its dependency on the selection of a "good value" for k. In this paper, we propose a novel kNN-type method for classificatio...


KNN-CF Approach: Incorporating Certainty Factor to kNN Classification

KNN classification finds the k nearest neighbors of a query in the training data and then predicts the class of the query as the most frequent one occurring among the neighbors. This is a typical method based on the majority rule. Although majority-rule based methods have been widely and successfully used in real applications, they can be unsuitable for learning settings with skewed class distr...
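The majority rule this abstract describes takes only a few lines; the sketch below (hypothetical `knn_predict` helper, toy data) also shows the skew problem the paper targets: a dominant class can outvote the query's true nearest neighbor.

```python
from collections import Counter
import numpy as np

def knn_predict(X_train, y_train, query, k=3):
    """Plain majority-rule kNN: the query gets the most frequent
    class among its k nearest training samples."""
    d = np.linalg.norm(X_train - query, axis=1)
    nearest = np.argsort(d)[:k]
    return Counter(y_train[nearest]).most_common(1)[0][0]

# Skewed classes: three "a" samples far from the query, one "b" close to it.
X = np.array([[0.0], [0.1], [0.2], [5.0]])
y = np.array(["a", "a", "a", "b"])
knn_predict(X, y, np.array([4.0]), k=3)  # → "a", although "b" is nearest
```

With k=3 the two distant "a" samples outvote the single nearest "b", which is exactly the failure mode that motivates weighting votes by a certainty factor.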



Journal:
  • Adv. Artificial Neural Systems

Volume 2010, Issue

Pages  -

Publication date: 2010